Matrix Manifold Optimization for Gaussian Mixtures
Authors
Abstract
We take a new look at parameter estimation for Gaussian Mixture Models (GMMs). Specifically, we advance Riemannian manifold optimization (on the manifold of positive definite matrices) as a potential replacement for Expectation Maximization (EM), which has been the de facto standard for decades. An out-of-the-box invocation of Riemannian optimization, however, fails spectacularly: it obtains the same solution as EM, but vastly slower. Building on intuition from geometric convexity, we propose a simple reformulation that has remarkable consequences: it makes Riemannian optimization not only match EM (a nontrivial result on its own, given the poor record nonlinear programming has had against EM), but also outperform it in many settings. To bring our ideas to fruition, we develop a well-tuned Riemannian LBFGS method that proves superior to known competing methods (e.g., Riemannian conjugate gradient). We hope that our results encourage a wider consideration of manifold optimization in machine learning and statistics.
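As a rough illustration of the machinery the abstract refers to, the sketch below runs Riemannian gradient descent on the manifold of positive definite matrices, fitting the covariance of a single zero-mean Gaussian under the affine-invariant metric with the matrix exponential map. This is a minimal stand-in for the general idea, not the paper's reformulated GMM objective or its LBFGS method; the objective, step size, and iteration count are all illustrative choices:

```python
import numpy as np
from scipy.linalg import expm, sqrtm

rng = np.random.default_rng(0)
d = 3
X = rng.standard_normal((500, d)) @ np.diag([2.0, 1.0, 0.5])
C = X.T @ X / len(X)  # sample covariance of the zero-mean data

def nll(S):
    # Negative log-likelihood of N(0, S), up to additive constants.
    _, logdet = np.linalg.slogdet(S)
    return 0.5 * (logdet + np.trace(np.linalg.solve(S, C)))

def riemannian_grad(S):
    # Affine-invariant metric: rgrad f = S * (egrad f) * S.
    # Here egrad = 0.5 * (S^{-1} - S^{-1} C S^{-1}), so rgrad = 0.5 * (S - C).
    return 0.5 * (S - C)

def exp_map(S, xi):
    # Exponential map on the SPD manifold:
    # Exp_S(xi) = S^{1/2} expm(S^{-1/2} xi S^{-1/2}) S^{1/2}
    R = np.real(sqrtm(S))
    Rinv = np.linalg.inv(R)
    return R @ expm(Rinv @ xi @ Rinv) @ R

S = np.eye(d)
for _ in range(300):
    S = exp_map(S, -0.5 * riemannian_grad(S))
    S = 0.5 * (S + S.T)  # symmetrize to suppress numerical drift
```

Because the negative log-likelihood is geodesically convex on this manifold, the iterates converge to the maximum-likelihood covariance, which for a zero-mean Gaussian is the sample covariance C.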
Similar Resources
Manifold Constrained Finite Gaussian Mixtures
In many practical applications, the data is organized along a manifold of lower dimension than the dimension of the embedding space. This additional information can be used when learning the model parameters of Gaussian mixtures. Based on a mismatch measure between the Euclidean and the geodesic distance, manifold constrained responsibilities are introduced. Experiments in density estimation sh...
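The Euclidean-versus-geodesic mismatch this snippet mentions can be sketched with an Isomap-style construction: approximate geodesic distances by shortest paths on a k-nearest-neighbour graph and compare them to straight-line distances in the embedding space. The dataset and parameters below are illustrative, not those of the cited paper:

```python
import numpy as np
from scipy.sparse.csgraph import shortest_path
from scipy.spatial.distance import cdist

# 100 points along a unit semicircle: a 1-D manifold embedded in R^2.
t = np.linspace(0.0, np.pi, 100)
X = np.column_stack([np.cos(t), np.sin(t)])

D = cdist(X, X)  # Euclidean distances in the embedding space
k = 3
W = np.full_like(D, np.inf)
for i in range(len(X)):
    nn = np.argsort(D[i])[1:k + 1]  # k nearest neighbours (skip the point itself)
    W[i, nn] = D[i, nn]
W = np.minimum(W, W.T)  # symmetrize the k-NN graph
G = shortest_path(W, method="D", directed=False)  # graph-geodesic distances

euc = D[0, -1]  # straight-line distance between the endpoints (the chord, length 2)
geo = G[0, -1]  # along-manifold distance (close to the arc length, pi)
```

For points far apart on a curved manifold the geodesic distance is markedly larger than the Euclidean one; that gap is exactly the kind of mismatch a manifold-aware mixture model can exploit.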
IAS-UVA-02-01: Procrustes Analysis to Coordinate Mixtures of Probabilistic Principal Component Analyzers
Mixtures of Probabilistic Principal Component Analyzers can be used to model data that lies on or near a low dimensional manifold in a high dimensional observation space, in effect tiling the manifold with local linear (Gaussian) patches. In order to exploit the low dimensional structure of the data manifold, the patches need to be localized and oriented in a low dimensional space, so that 'loc...
THE CMA EVOLUTION STRATEGY BASED SIZE OPTIMIZATION OF TRUSS STRUCTURES
Evolution Strategies (ES) are a class of Evolutionary Algorithms based on Gaussian mutation and deterministic selection. Gaussian mutation captures pair-wise dependencies between the variables through a covariance matrix. Covariance Matrix Adaptation (CMA) is a method to update this covariance matrix. In this paper, the CMA-ES, which has found many applications in solving continuous optimizatio...
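The covariance-adaptation idea can be sketched with a rank-one update alone; this is a deliberate simplification of CMA-ES, which also uses a rank-mu update, an evolution path, and step-size control, and the direction and learning rate below are illustrative:

```python
import numpy as np

def rank_one_update(C, p, c1=0.1):
    # Shift the mutation covariance toward the outer product of a
    # successful search direction p: C <- (1 - c1) * C + c1 * p p^T.
    return (1.0 - c1) * C + c1 * np.outer(p, p)

C = np.eye(2)
p = np.array([1.0, 1.0]) / np.sqrt(2.0)  # a consistently successful direction
for _ in range(200):
    C = rank_one_update(C, p)

# After many updates the dominant eigenvector of C aligns with p,
# so Gaussian mutations are stretched along the successful direction.
w, V = np.linalg.eigh(C)
top = V[:, -1]
```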
Refining Gaussian mixture model based on enhanced manifold learning
The Gaussian mixture model (GMM) has been widely used for data analysis in various domains including text documents, face images and genes. GMM can be viewed as a simple linear superposition of Gaussian components, each of which represents a data cluster. Recent models, namely Laplacian regularized GMM (LapGMM) and locally consistent GMM (LCGMM), have been proposed to preserve the ...
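The "linear superposition of Gaussian components" view can be made concrete: the sketch below evaluates a two-component mixture density and the per-component responsibilities, which are the posterior membership probabilities that EM's E-step computes. All parameter values are illustrative:

```python
import numpy as np

def gaussian_pdf(x, mu, Sigma):
    # Density of the multivariate normal N(mu, Sigma) at x.
    d = len(mu)
    diff = x - mu
    quad = diff @ np.linalg.solve(Sigma, diff)
    norm = np.sqrt((2 * np.pi) ** d * np.linalg.det(Sigma))
    return np.exp(-0.5 * quad) / norm

def gmm_pdf(x, weights, mus, Sigmas):
    # GMM density: p(x) = sum_k pi_k * N(x | mu_k, Sigma_k).
    return sum(w * gaussian_pdf(x, mu, S)
               for w, mu, S in zip(weights, mus, Sigmas))

weights = [0.3, 0.7]
mus = [np.zeros(2), np.array([3.0, 0.0])]
Sigmas = [np.eye(2), 0.5 * np.eye(2)]

# Responsibilities: posterior probability that x was generated by each
# component, normalized over the mixture.
x = np.array([2.5, 0.0])
joint = np.array([w * gaussian_pdf(x, mu, S)
                  for w, mu, S in zip(weights, mus, Sigmas)])
resp = joint / joint.sum()
```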
Mixtures of multivariate power exponential distributions.
An expanded family of mixtures of multivariate power exponential distributions is introduced. While fitting heavy tails and skewness has received much attention in the model-based clustering literature recently, we investigate the use of a distribution that can deal with both varying tail-weight and peakedness of data. A family of parsimonious models is proposed using an eigen-decomposition of...